Markoff process

Definitions of Markoff process
  1. noun
    a simple stochastic process in which the distribution of future states depends only on the present state, not on how the process arrived at that state
    synonyms: Markov process
    types:
    Markoff chain, Markov chain
    a Markov process in which the time parameter takes discrete values
    type of:
    stochastic process
    a statistical process involving a number of random variables depending on a variable parameter (which is usually time)
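The memoryless property in the definition above can be illustrated with a short sketch. This is a hypothetical example, not part of the entry: a two-state Markov chain (states "sunny" and "rainy" with made-up transition probabilities) in which each step is drawn using only the current state.

```python
import random

# Hypothetical transition table: each row maps a current state to
# (next state, probability) pairs. Only the current state matters,
# which is exactly the Markov property described above.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def simulate(start, steps, seed=0):
    """Run the chain for `steps` transitions and return the visited path."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        states, probs = zip(*TRANSITIONS[path[-1]])
        # The draw depends only on path[-1], never on earlier history.
        path.append(rng.choices(states, weights=probs, k=1)[0])
    return path

print(simulate("sunny", 5))
```

Because the chain above steps at whole-number times, it is a Markov chain in the sense of the entry; a Markov process in continuous time would instead be governed by transition rates.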